On the mean square error of randomized averaging algorithms

Authors

  • Paolo Frasca
  • Julien M. Hendrickx
Abstract

This paper regards randomized discrete-time consensus systems that preserve the average on expectation. As a main result, we provide an upper bound on the mean square deviation of the consensus value from the initial average. Then, we particularize our result to systems where the interactions which take place simultaneously are few, or weakly correlated; these assumptions cover several algorithms proposed in the literature. For such systems we show that, when the system size grows, the deviation tends to zero, not slower than the inverse of the size. Our results are based on a new approach, unrelated to the convergence properties of the system: this independence questions the relevance in this context of the spectral properties of the matrices related to the graph of possible interactions, which have appeared in some previous results.
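To make the class of systems in the abstract concrete, here is a minimal sketch (not the paper's construction) of symmetric pairwise gossip averaging. Symmetric gossip preserves the network average exactly, a special case of the preservation in expectation studied above, and the squared deviation of the final values from the initial average can be measured empirically. All parameters (network size, number of steps) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gossip_average(x0, steps=2000):
    """Randomized pairwise gossip: at each step two nodes chosen
    uniformly at random replace their values by their mutual average.
    Each update matrix is doubly stochastic, so the network average
    is preserved exactly (a fortiori in expectation)."""
    x = np.array(x0, dtype=float)
    n = len(x)
    for _ in range(steps):
        i, j = rng.choice(n, size=2, replace=False)
        m = 0.5 * (x[i] + x[j])
        x[i] = x[j] = m
    return x

x0 = rng.normal(size=50)
x = gossip_average(x0)
# the average is preserved up to floating-point error
assert abs(x.mean() - x0.mean()) < 1e-8
# mean square deviation of the node values from the initial average
mse = np.mean((x - x0.mean()) ** 2)
```

In this symmetric special case the deviation comes only from incomplete convergence, not from randomness in the conserved quantity; the paper's bound addresses the harder setting where the average itself is random and only conserved in expectation.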


Similar articles

Performance of Error Normalized Step Size LMS and NLMS Algorithms: A Comparative Study

This paper presents a comparative study of the NLMS (Normalized Least Mean Square) and ENSS (Error Normalized Step Size) LMS (Least Mean Square) algorithms. System identification, an adaptive filter application, is considered for this purpose. Three performance criteria are used in this study: minimum mean square error (MSE), convergence speed, and algorithm execution time. The step size parameter (μ...
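For reference, a minimal NLMS sketch for system identification in Python; the filter order, step size, and the FIR coefficients of the "unknown" system below are hypothetical, chosen only to illustrate the normalized update:

```python
import numpy as np

rng = np.random.default_rng(1)

def nlms_identify(d, x, order=4, mu=0.5, eps=1e-8):
    """NLMS adaptive filter for system identification.
    The step size mu is normalized by the instantaneous input
    energy, making the update insensitive to the input scale."""
    w = np.zeros(order)
    errs = []
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]   # regressor, newest sample first
        e = d[n] - w @ u                   # a-priori error
        w += (mu / (eps + u @ u)) * e * u  # normalized LMS update
        errs.append(e)
    return w, np.array(errs)

# hypothetical unknown FIR system to identify
h = np.array([0.4, -0.3, 0.2, 0.1])
x = rng.normal(size=5000)
d = np.convolve(x, h)[:len(x)]  # noiseless desired signal
w, errs = nlms_identify(d, x)
```

In this noiseless setting the estimated weights converge to the true coefficients and the a-priori error decays toward zero, which is what the MSE and convergence-speed criteria in the study quantify.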


Stochastic computational methods for parabolic equations with random data

Here f(x, t; ω), w0(x; ω) are random with uniformly bounded second moments and continuous sample functions, where ω ∈ (Ω, A, P), some suitable probability space. Equations (1), (2) are to be satisfied with probability one. Assume that f is mean square continuous in Q and mean square Hölder continuous in x uniformly in Q, and that w0 is mean square continuous in ℝ^m. Then [1], the unique solution to the proble...


An Analytical Model for Predicting the Convergence Behavior of the Least Mean Mixed-Norm (LMMN) Algorithm

The Least Mean Mixed-Norm (LMMN) algorithm is a stochastic gradient-based algorithm whose objective is to minimize a combination of the cost functions of the Least Mean Square (LMS) and Least Mean Fourth (LMF) algorithms. This algorithm has inherited many properties and advantages of the LMS and LMF algorithms and mitigates their weaknesses in some ways. The main issue of the LMMN algorithm is t...
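The mixed cost described above is J(w) = δ·E[e²]/2 + (1−δ)·E[e⁴]/4, whose stochastic gradient yields the update w ← w + μ(δe + (1−δ)e³)u; δ = 1 recovers LMS and δ = 0 recovers LMF. A minimal sketch (step size, mixing parameter, and the "unknown" system are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

def lmmn_identify(d, x, order=4, mu=0.01, delta=0.5):
    """LMMN adaptive filter: stochastic gradient descent on the
    mixed cost delta*E[e^2]/2 + (1-delta)*E[e^4]/4.
    delta=1 gives the LMS update, delta=0 the LMF update."""
    w = np.zeros(order)
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]          # regressor
        e = d[n] - w @ u                          # a-priori error
        w += mu * (delta * e + (1 - delta) * e**3) * u
    return w

# hypothetical unknown FIR system
h = np.array([0.5, -0.2, 0.1, 0.05])
x = rng.normal(size=20000)
d = np.convolve(x, h)[:len(x)]  # noiseless desired signal
w = lmmn_identify(d, x)
```

The cubic error term dominates when the error is large, which is the mechanism through which LMMN inherits LMF's behavior while the linear term preserves LMS-like stability near convergence.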


Performance Analysis of Adaptive Filtering Algorithms for System Identification

The paper presents a comparative study of the NLMS (Normalized Least Mean Square), NVSS (New Variable Step Size) LMS (Least Mean Square), RVSS (Robust Variable Step Size) LMS, TVLMS (Time Varying Least Mean Square), and IVSS (Improved Variable Step Size) LMS adaptive filter algorithms. Four performance criteria are used in this study: minimum mean square error (MSE), convergence speed, algorit...


Evaluating Quasi-Monte Carlo (QMC) algorithms in blocks decomposition of de-trended

The length of the equal minimal and maximal blocks affects, on a log-log scale, the variance and bias of de-trended fluctuation analysis. Using Quasi-Monte Carlo (QMC) simulation and Cholesky decompositions, the minimal and maximal block pair is found that minimizes the summation of the mean square error in the Hurst exponent.



Journal:
  • Automatica

Volume 49, Issue -

Pages -

Publication date: 2013